MedBERT is a Transformer-based pretrained language model designed specifically for biomedical named entity recognition (NER). It is initialized from Bio_ClinicalBERT and further pretrained on multiple biomedical datasets.
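A minimal usage sketch with the Hugging Face `transformers` library is shown below. The repository ID `your-org/MedBERT` is a placeholder (substitute the actual model ID), and the checkpoint would typically be fine-tuned on a labeled NER dataset before its predictions are meaningful.

```python
# Minimal sketch: loading MedBERT for biomedical NER with Hugging Face transformers.
# "your-org/MedBERT" is a placeholder repository ID, not the published checkpoint name.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "your-org/MedBERT"  # placeholder; replace with the real model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# The token-classification pipeline merges word-piece predictions into entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

text = "The patient was prescribed metformin for type 2 diabetes mellitus."
for entity in ner(text):
    print(entity["word"], entity["entity_group"], round(entity["score"], 3))
```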
Task: Sequence Labeling · Library: Transformers · Language: English